feat(poc): replace Docker-based smoke tests with native Node.js harness #2231
rostalan wants to merge 8 commits into redhat-developer:main from
Conversation
Skipping CI for Draft Pull Request.
}

// ---------------------------------------------------------------------------
// OCI Download (mirrors install-dynamic-plugins.py §663-715)
The comment itself acknowledges this mirrors install-dynamic-plugins.py. Should we just call the Python script directly as a pre-step instead of reimplementing it in JS?
The overlay repo currently gets the script from the RHDH container image — by removing Docker, we lose access to it. But we could copy the script here (with a version marker) as a short-term solution. That would:
- Eliminate ~60 lines of JS OCI code
- Inherit all security protections for free (zip bomb, symlink traversal, integrity checks)
- Leverage the 130KB+ test suite that already covers this logic
Longer-term options: publish as a pip package or container tool so any repo can consume it.
For context, there's also a parallel POC in RHDH core (redhat-developer/rhdh#4523) that extends the same Python script with parallel downloads (ThreadPoolExecutor) achieving ~34% speedup. Reusing it here would get that performance benefit too.
There was a problem hiding this comment.
For now, I added a step that downloads the install-dynamic-plugins script and its requirements and uses it directly. Not sure if this would benefit from the parallel downloads, since it only runs on a single workspace, so only a couple of OCIs at a time.
function extractPlugin(tarFile, pluginPath, dest) {
  mkdirSync(join(dest, pluginPath), { recursive: true });
  execSync(`tar xf "${tarFile}" -C "${dest}" "${pluginPath}/"`, {
This extracts tar contents with a bare execSync('tar xf ...') — no validation of archive members before extraction. The Python script in RHDH core (install-dynamic-plugins.py) checks for:
- Zip bomb: member.size > MAX_ENTRY_SIZE (20 MB default per entry)
- Symlink traversal: os.path.realpath() validation against the destination directory
- Hardlink traversal: same check
- Device files/FIFOs: rejected entirely
- Safe tar filter: tar.extractall(..., filter='tar')
Even in CI, a crafted OCI layer could exploit path traversal via symlinks. If we keep the JS implementation, all of these checks need to be ported.
Replaced by the Python script.
  if (existsSync(p)) process.argv.push("--config", p);
}

const { createBackend } = await import("@backstage/backend-defaults");
createBackend() is the production API. Backstage provides startTestBackend() from @backstage/backend-test-utils specifically for this use case — it handles lifecycle management, automatic cleanup, built-in in-memory SQLite, and exposes mock services. It would also reduce the manual plugin registration below (~20 lines of backend.add() calls).
redhat-developer/rhdh#4523 uses startTestBackend() for the same plugin loadability validation and it works well.
This was meant to keep the environment somewhat real, but I guess it makes sense to have this as a simple "plugin loads" check rather than anything more complicated.
// Config generation
// ---------------------------------------------------------------------------

function deepMerge(src, dst) {
This reimplements config deep-merge that already exists in install-dynamic-plugins.py (the merge() function). The Python version also detects key conflicts and raises errors — this one silently overwrites. If we use the Python script as a pre-step, this function becomes unnecessary since the script already generates app-config.dynamic-plugins.yaml with all plugin configs merged.
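For comparison, a conflict-detecting merge in JS could be sketched as below. This is illustrative only (the `mergeStrict` name is hypothetical, and the exact conflict semantics of the Python merge() are assumed, not quoted):

```javascript
// Merge `src` into `dst`, throwing on conflicting scalar values instead of
// silently overwriting -- roughly the behavior attributed to the Python merge().
function mergeStrict(src, dst, keyPath = "") {
  for (const [key, value] of Object.entries(src)) {
    const here = keyPath ? `${keyPath}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      // Recurse into nested objects, creating the destination branch if needed.
      if (dst[key] === undefined) dst[key] = {};
      mergeStrict(value, dst[key], here);
    } else if (dst[key] !== undefined && JSON.stringify(dst[key]) !== JSON.stringify(value)) {
      // Same key, different value: surface the conflict instead of clobbering.
      throw new Error(`Config conflict at "${here}": ${JSON.stringify(dst[key])} vs ${JSON.stringify(value)}`);
    } else {
      dst[key] = value;
    }
  }
  return dst;
}
```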
Replaced with the Python script.
);

backend.add(
  createBackendPlugin({
This probe plugin concept is great — using dynamicPluginsServiceRef to verify frontend plugins actually loaded at runtime is more thorough than filesystem-only checks. Worth keeping regardless of the OCI download approach. The sanity check POC in RHDH core (redhat-developer/rhdh#4523) currently only validates bundles via filesystem (dist-scalprum/plugin-manifest.json); this runtime probe approach would complement it nicely.
  });
}

async function downloadPlugins(plugins, dest) {
Downloads are sequential here. The install-dynamic-plugins-fast.py in redhat-developer/rhdh#4523 uses ThreadPoolExecutor for parallel OCI downloads with a shared image cache (one download per unique image, not per plugin), achieving ~34% speedup. Another reason to reuse the existing script rather than maintaining a separate implementation.
Replaced with the basic Python script, same reason as above.
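For reference, the shared-cache parallel pattern can be sketched in a few lines of JS. This is a hypothetical illustration (the `downloadPluginsParallel` and `fetchImage` names are invented), not the actual ThreadPoolExecutor code:

```javascript
// Download each unique OCI image once, with bounded concurrency.
// `fetchImage` is a hypothetical async downloader; plugins that share an image
// reuse the same in-flight promise (one download per image, not per plugin).
async function downloadPluginsParallel(plugins, fetchImage, concurrency = 4) {
  const cache = new Map(); // image -> Promise<layer>
  let active = 0;
  const queue = [];
  const withLimit = (fn) =>
    new Promise((resolve, reject) => {
      const run = async () => {
        active++;
        try { resolve(await fn()); } catch (e) { reject(e); }
        finally { active--; queue.shift()?.(); }
      };
      active < concurrency ? run() : queue.push(run);
    });
  return Promise.all(
    plugins.map((p) => {
      if (!cache.has(p.image)) cache.set(p.image, withLimit(() => fetchImage(p.image)));
      return cache.get(p.image).then((layer) => ({ plugin: p.name, layer }));
    })
  );
}
```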
- name: Log in to GitHub Container Registry
  uses: docker/login-action@4907a6ddec9925e35a0a9e82d7399ccc52663121 # v4.1.0
- name: Setup Node.js
  uses: actions/setup-node@v4
This should be pinned to a hash and match the other workflows:
- uses: actions/setup-node@v4
+ uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
    registry: ghcr.io
    username: ${{ github.actor }}
    password: ${{ secrets.GITHUB_TOKEN }}
    node-version: "20"
There are two LTS releases after this one that are currently active, let's not target older Node.js versions if we can avoid it.
- node-version: "20"
+ node-version: "24"
  echo "RHDH did not become ready in time."
  exit 1
curl -fsSL \
  "https://raw.githubusercontent.com/redhat-developer/rhdh/3efb9cc140ff/scripts/install-dynamic-plugins/install-dynamic-plugins.py" \
Is this supposed to be pinned to a specific commit?
This will get replaced with the TS version after redhat-developer/rhdh#4574 gets merged anyway...
"type": "module",
"description": "Lightweight smoke test harness for RHDH dynamic plugins — no Docker required",
"scripts": {
  "test": "node smoke-test.mjs",
Looks like this script is not triggered in the CI but rather executed manually; let's use npm test for consistency.
"@backstage/backend-dynamic-feature-service": "0.7.9",
"@backstage/backend-plugin-api": "1.7.0",
"@backstage/backend-test-utils": "1.4.0",
"@backstage/cli-node": "0.2.18",
"@backstage/config": "1.3.6",
"@backstage/config-loader": "1.10.8",
"@backstage/errors": "1.2.7",
"@backstage/types": "1.2.2",
"js-yaml": "4.1.0"
Let's not rely on pinned versions here, but use caret ranges ^ and a lockfile.
There is no need to use .mjs; you can use .js and Node will detect it, since "type": "module" is already set in the package.
Node.js can run TypeScript these days, so using .ts would also work, and make your code type-safe.
// CLI
// ---------------------------------------------------------------------------

function parseArgs(argv) {
Node.js has built-in support for parsing CLI arguments, there is no need to implement this yourself.
  return args;
}

function loadEnvFile(filePath) {
Node.js has built-in support for parsing .env files, so there is no need to implement this yourself.
// Config helpers
// ---------------------------------------------------------------------------

function deepMerge(src, dst) {
The custom deepMerge and loadConfigs can be removed entirely — Backstage's config loader already handles deep merging of multiple --config files. Just push the config paths onto process.argv before calling startTestBackend() and let the default rootConfig service pick them up. This is the same approach the old Docker-based smoke test used (chaining --config flags on the node packages/backend command).
This also removes the need for mockServices.rootConfig.factory({ data }).
Boot a minimal Backstage backend directly on the runner using createBackend() + dynamicPluginsFeatureLoader, probe /api/<pluginId> routes, and report results as structured JSON. Includes core bundled plugins (catalog, auth, permission, scaffolder, events, search, proxy) so dynamic plugins resolve their dependencies correctly.
Made-with: Cursor
…aded-plugin probe
Add two-layer frontend plugin validation to the smoke test harness:
- Layer 1: static validation of dist-scalprum/ after OCI download
- Layer 2: inline backend probe plugin querying dynamicPluginsServiceRef
Made-with: Cursor
…to install-dynamic-plugins.py
Replace createBackend() with startTestBackend() for a minimal, focused smoke test that only validates dynamic plugin loading. Remove 23 static plugin dependencies and the in-process OCI download logic in favor of the upstream install-dynamic-plugins.py script fetched at CI time.
Made-with: Cursor
Signed-off-by: rlan@redhat.com
…d smoke-test file
Boot a minimal Backstage backend directly on the runner using createBackend() + dynamicPluginsFeatureLoader, probe /api/<pluginId> routes, and report results as structured JSON. Includes core bundled plugins (catalog, auth, permission, scaffolder, events, search, proxy) so dynamic plugins resolve their dependencies correctly.

Frontend plugins are also validated via two layers:
- dist-scalprum/ exists and contains JavaScript files after OCI download
- an inline probe queries dynamicPluginsServiceRef to confirm frontend plugins were registered without errors

Results are merged into a single report with per-plugin status (pass, fail-bundle, fail-load, warn, skip) and the process exits non-zero on any failure.

Made-with: Cursor